
How Do You Block Crawlers on a Server?

Site-Building Tips

Summary: I. Apache ①. Via the .htaccess file: edit the .htaccess in the site's root directory and add either of the two snippets below. Option (1): RewriteEngine On RewriteCond %{HTTP_USER_AGENT} (^$|FeedDemo…

Published: 2019-05-12

Editor: Xingtian Technology

Views: 13,538


I. Apache

①. Via the .htaccess file

Edit the .htaccess file in the site's root directory and add either of the following two snippets (pick one):

Option (1):

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (^$|FeedDemon|IndyLibrary|AlexaToolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedparser|ApacheBench|MicrosoftURLControl|Swiftbot|ZmEu|oBot|jaunty|python-urllib|lightDeckReportsBot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms) [NC]
RewriteRule ^(.*)$ - [F]
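The RewriteCond pattern above is a plain alternation of substrings, so you can sanity-check it locally with grep before deploying (a sketch only; Apache does the real matching at request time, and the shortened UA_RE below is an illustrative assumption, not the full list):

```shell
# Abbreviated copy of the blacklist pattern (the full list goes in .htaccess)
UA_RE='FeedDemon|IndyLibrary|AhrefsBot|MJ12bot|EasouSpider'
# A blacklisted UA should match; a normal browser UA should not
echo 'Mozilla/5.0 (compatible; AhrefsBot/7.0)' | grep -Eqi "$UA_RE" && echo blocked
echo 'Mozilla/5.0 (Windows NT 10.0) Chrome/90' | grep -Eqi "$UA_RE" || echo allowed
```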

Option (2):

SetEnvIfNoCase User-Agent ".*(FeedDemon|IndyLibrary|AlexaToolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedparser|ApacheBench|MicrosoftURLControl|Swiftbot|ZmEu|oBot|jaunty|python-urllib|lightDeckReportsBot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms)" BADBOT
Order Allow,Deny
Allow from all
Deny from env=BADBOT

②. Via the httpd.conf configuration file

Find the corresponding section, add or modify it as shown below, then restart Apache:

    Shell

DocumentRoot /home/wwwroot/xxx
<Directory "/home/wwwroot/xxx">
SetEnvIfNoCase User-Agent ".*(FeedDemon|IndyLibrary|AlexaToolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedparser|ApacheBench|MicrosoftURLControl|Swiftbot|ZmEu|oBot|jaunty|python-urllib|lightDeckReportsBot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms)" BADBOT
Order allow,deny
Allow from all
Deny from env=BADBOT
</Directory>


II. Nginx

Go into the conf directory under the nginx installation directory and save the following as agent_deny.conf:

cd /usr/local/nginx/conf
vim agent_deny.conf

#Block scraping tools such as Scrapy
if ($http_user_agent ~* (Scrapy|Curl|HttpClient)) {
    return 403;
}
#Block the listed UAs as well as empty UAs
if ($http_user_agent ~* "FeedDemon|IndyLibrary|AlexaToolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedparser|ApacheBench|MicrosoftURLControl|Swiftbot|ZmEu|oBot|jaunty|python-urllib|lightDeckReportsBot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
    return 403;
}
#Block any request method other than GET|HEAD|POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
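The last rule whitelists only GET, HEAD and POST. The same regex can be exercised locally before deploying (a sketch; nginx evaluates $request_method itself, and grep is only standing in for nginx's regex operator here):

```shell
# Mirror of the ^(GET|HEAD|POST)$ check from agent_deny.conf
for m in GET HEAD POST DELETE; do
  if echo "$m" | grep -Eq '^(GET|HEAD|POST)$'; then
    echo "$m allowed"
  else
    echo "$m blocked"   # nginx would answer 403 for this method
  fi
done
```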

Then, in the site's nginx configuration, insert the following line right after location / {:

    Shell

include agent_deny.conf;

For example:

    Shell

[marsge@Mars_Server ~]$ cat /usr/local/nginx/conf/zhangge.conf
location / {
    try_files $uri $uri/ /index.php?$args;
    #Add this one line:
    include agent_deny.conf;
    rewrite ^/sitemap_360_sp.txt$ /sitemap_360_sp.php last;
    rewrite ^/sitemap_baidu_sp.xml$ /sitemap_baidu_sp.php last;
    rewrite ^/sitemap_m.xml$ /sitemap_m.php last;
}

After saving, run the following command to gracefully reload nginx:

    Shell

/usr/local/nginx/sbin/nginx -s reload

III. PHP

Paste the following code into the site's entry file, index.php, right after the first <?php tag:

PHP

//Get the UA string
$ua = $_SERVER['HTTP_USER_AGENT'];
//Put the malicious USER_AGENTs into an array
$now_ua = array('FeedDemon', 'BOT/0.1 (BOT for JCE)', 'CrawlDaddy', 'Java', 'Feedly', 'UniversalFeedparser', 'ApacheBench', 'Swiftbot', 'ZmEu', 'IndyLibrary', 'oBot', 'jaunty', 'YandexBot', 'AhrefsBot', 'MJ12bot', 'WinHttp', 'EasouSpider', 'HttpClient', 'MicrosoftURLControl', 'YYSpider', 'python-urllib', 'lightDeckReportsBot');
//Block empty USER_AGENT: mainstream scraping programs such as dedecms, and some SQL injection tools, send an empty USER_AGENT
if (!$ua) {
    header("Content-type: text/html; charset=utf-8");
    die('Please do not scrape this site!');
} else {
    foreach ($now_ua as $value) {
        //Check whether the UA contains one of the blacklisted strings
        //(stripos replaces the original eregi() call, which was removed in PHP 7)
        if (stripos($ua, $value) !== false) {
            header("Content-type: text/html; charset=utf-8");
            die('Please do not scrape this site!');
        }
    }
}
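The PHP logic above — deny an empty UA outright, otherwise scan the blacklist with a case-insensitive substring match — can be sketched as a small shell function for experimenting outside PHP (check_ua and the shortened BAD list here are illustrative assumptions, not part of the article's code):

```shell
# Case-insensitive substring blacklist, mirroring the PHP loop
BAD='FeedDemon|CrawlDaddy|AhrefsBot|MJ12bot|EasouSpider'
check_ua() {
  # An empty UA is denied outright, as in the PHP snippet
  if [ -z "$1" ]; then echo deny; return; fi
  echo "$1" | grep -Eqi "$BAD" && echo deny || echo allow
}
check_ua ''                                   # deny
check_ua 'Mozilla/5.0 (compatible; MJ12bot)'  # deny
check_ua 'Mozilla/5.0 Chrome/90'              # allow
```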

IV. Testing the Result

If you are on a VPS, testing is easy: use curl -A to simulate a crawler. For example:

Simulate a crawl by YisouSpider:

    Shell

curl -I -A 'YisouSpider' bizhi.bcoderss.com

Simulate a crawl with an empty UA:

    Shell

curl -I -A '' bizhi.bcoderss.com

Simulate a crawl by Baiduspider:

    Shell

curl -I -A 'Baiduspider' bizhi.bcoderss.com
